The papers in this volume provide a snapshot of the variety of issues that must be considered when applying AI technologies to projects involving people with disabilities. The focus here is on control interfaces and on language issues. The AI technologies used are quite varied; the resulting ideas have a great deal of promise and point us to future possibilities.
We present an approach to iconic language design for people with significant speech and multiple impairments (SSMI), based upon the theory of Icon Algebra and the theory of Conceptual Dependency (CD) to derive the semantics of iconic sentences. A knowledge-based design environment supporting the phases of this approach is described.
This work presents a method for translation of American Sign Language (ASL) to English using a feature-based lexicon, designed to exploit ASL's phonology by searching the lexicon for the sign's manual and non-manual information. Manual sign information consists of phonemes sig (movement), tab (location), dez (handshape), and ori (hand orientation), which we use as the ASL unit of analysis. Non-manual...
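The abstract above describes indexing a sign lexicon on the four manual phonemes (sig, tab, dez, ori). A minimal sketch of what such a feature-keyed lookup could look like follows; the example entries, gloss names, and matching scheme are illustrative assumptions, not the paper's actual lexicon:

```python
# Hypothetical sketch of a feature-based sign lexicon lookup.
# The phoneme names (sig, tab, dez, ori) come from the abstract;
# the entries and feature values below are made-up examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class ManualFeatures:
    sig: str  # movement
    tab: str  # location
    dez: str  # handshape
    ori: str  # hand orientation

# Toy lexicon keyed on the four manual phonemes of a sign.
LEXICON = {
    ManualFeatures(sig="tap", tab="chin", dez="flat-B", ori="palm-in"): "MOTHER",
    ManualFeatures(sig="arc", tab="chest", dez="flat-B", ori="palm-in"): "MY",
}

def lookup(features: ManualFeatures):
    """Return the English gloss for a fully specified manual sign, if any."""
    return LEXICON.get(features)

print(lookup(ManualFeatures("tap", "chin", "flat-B", "palm-in")))  # MOTHER
```

Because the feature tuple is hashable, recognized phoneme values resolve to a gloss in one dictionary probe; a real system would also need partial matching and the non-manual channel the abstract mentions.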
EagleEyes is a system that allows the user to control the computer through electrodes placed on the head. For people without disabilities it takes 15 to 30 minutes to learn to control the cursor sufficiently to spell out a message with an onscreen keyboard. We are currently working with two dozen children with profound disabilities to teach them to use EagleEyes to control computer software for entertainment,...
People with severe speech and motor impairments (SSMI) can often use augmentative communication devices to help them communicate. While these devices can provide speech synthesis or text output, the rate of communication is typically very slow. Consequently, augmentative communication users often develop telegraphic patterns of language usage. A natural language processing technique termed compansion...
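Compansion, as described above, expands telegraphic input (content words only) into a full sentence. The toy sketch below illustrates the general idea with a hard-coded case frame and an animacy heuristic for role assignment; the lexicons and the rendering rule are assumptions for illustration, not the technique's actual implementation:

```python
# Toy sketch of compansion-style expansion: telegraphic content words are
# mapped onto a case frame (agent, verb, object) and rendered as a full
# sentence. The verb list and animacy set are made-up examples.
ANIMATE = {"john", "mary", "teacher"}   # tiny animacy lexicon (assumption)
VERBS = {"eat", "want", "see"}          # tiny verb lexicon (assumption)

def expand(words):
    verb = agent = obj = None
    for w in words:
        lw = w.lower()
        if lw in VERBS:
            verb = lw
        elif lw in ANIMATE and agent is None:   # animate noun -> agent role
            agent = w.capitalize()
        else:                                   # remaining noun -> object role
            obj = lw
    return f"{agent} {verb}s the {obj}."

print(expand(["apple", "eat", "John"]))  # John eats the apple.
```

Note that word order in the telegraphic input does not matter here; roles are filled from semantic cues, which is the point of the technique.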
This paper considers ways in which a person can cue and constrain an artificial agent's attention to salient features. In one experiment, a person uses gestures to direct an otherwise autonomous robot hand through a known task. Each gesture instantiates the key spatial and intentional features for the task at that moment in time. In a second hypothetical task, a person uses speech and gesture to assist...
Modern wearable computer designs package workstation level performance in systems small enough to be worn as clothing. These machines enable technology to be brought where it is needed the most for the handicapped: everyday mobile environments. This paper describes a research effort to make a wearable computer that can recognize (with the possible goal of translating) sentence level American Sign...
In this paper, we present a prototype translation system named SYUWAN which translates Japanese into Japanese sign language. One of the most important problems in this task is that there are very few entries in a sign language dictionary compared with a Japanese one. To solve this problem, when the original input word does not exist in a sign language dictionary SYUWAN applies several techniques to...
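The key problem the SYUWAN abstract raises is a sparse sign-language dictionary: many Japanese words have no direct entry. A minimal sketch of the fallback idea, with a synonym-substitution step standing in for the paper's several techniques, could look like this (the dictionaries below are made-up examples):

```python
# Sketch of dictionary lookup with fallback for a sparse sign lexicon.
# SIGN_DICT and SYNONYMS are illustrative toy data, not SYUWAN's resources.
SIGN_DICT = {"eat": "SIGN_EAT", "house": "SIGN_HOUSE"}
SYNONYMS = {"dine": "eat", "residence": "house"}

def to_sign(word: str):
    if word in SIGN_DICT:            # 1. direct dictionary entry
        return SIGN_DICT[word]
    syn = SYNONYMS.get(word)
    if syn and syn in SIGN_DICT:     # 2. substitute a synonym that has an entry
        return SIGN_DICT[syn]
    return None                      # 3. a real system might fingerspell here

print(to_sign("residence"))  # SIGN_HOUSE
```

The ordering matters: cheaper, more faithful substitutions are tried before looser ones, so the output sign stays as close as possible to the input word's meaning.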
Many people with severe speech and motor impairments make use of augmentative and alternative communication (AAC) systems. These systems can employ a variety of techniques to organize stored words, phrases, and sentences, and to make them available to the user. It is argued in this chapter that an AAC system should make better use of the regularities in an individual's conversational experiences and...
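One simple way to exploit the conversational regularities the chapter argues for is to rank stored utterances by how often they followed a partner's utterance in past conversations. The sketch below is an illustrative assumption about how such retrieval could work, not the chapter's actual model:

```python
# Minimal sketch of regularity-based retrieval for an AAC system:
# suggest replies ranked by how often they followed the same prompt
# in the user's conversation history. The history data is made up.
from collections import Counter

history = [
    ("how are you", "fine thanks"),
    ("how are you", "fine thanks"),
    ("how are you", "tired today"),
]

def suggest(prompt, k=2):
    """Return up to k stored replies, most frequent first."""
    counts = Counter(reply for p, reply in history if p == prompt)
    return [reply for reply, _ in counts.most_common(k)]

print(suggest("how are you"))  # ['fine thanks', 'tired today']
```

Surfacing a likely whole utterance in one selection, rather than composing it word by word, is what raises the communication rate the abstract is concerned with.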
Intelligent robotics is the study of how to make machines that can exhibit many of the qualities of people. This seems a very appropriate technology to use to assist those people who have lost certain abilities that are common to the majority of the population. This paper gives an overview of some of the robotic technology that has been or is being developed to assist people with disabilities.
We aim to develop a robot which can be commanded simply and accurately, especially by users with reduced mobility. Our shared control approach divides task responsibilities between the user (high level) and the robot (low level). A video interface shared between the user and robot enables the use of a deictic interface. The paper describes our progress toward this goal in several areas. A complete...
A brief survey of research in the development of autonomy in wheelchairs is presented and AAI's R&D to build a series of intelligent autonomous wheelchairs is discussed. A standardized autonomy management system that can be installed on readily available power chairs which have been well-engineered over the years has been developed and tested. A behavior-based approach was used to establish sufficient...
This paper describes the goals and research directions of the University of Texas Artificial Intelligence Lab's Intelligent Wheelchair Project (IWP). The IWP is a work in progress. The authors are part of a collaborative effort to bring expertise from knowledge representation, control, planning, and machine vision to bear on this difficult and interesting problem domain. Our strategy uses knowledge...
The Multimodal User Supervised Interface and Intelligent Control (MUSIIC) project addresses the issue of telemanipulation of everyday objects in an unstructured environment. Telerobot control by individuals with physical limitations poses a set of challenging problems that need to be resolved. MUSIIC addresses these problems by integrating a speech and gesture driven human-machine interface with a...
People with both visual and mobility impairments have great difficulty using conventional mobility aids for the blind. As a consequence they have little opportunity to take exercise without assistance from a carer. The combination of visual and mobility impairments occurs most often among the elderly. In this paper we examine the issues related to mobility for the blind and pay particular attention...
A Robotic Travel Aid (RoTA) is a motorized wheelchair equipped with vision, sonar, and tactile sensors and a map database system. A RoTA can provide a visually impaired user assistance with orientation and obstacle avoidance, as well as information about their present location, landmarks, and the route being followed. In this paper we describe HITOMI, an implementation of the RoTA concept that can...
Many people in wheelchairs are unable to control a powered wheelchair with the standard joystick interface. A robotic wheelchair can provide users with driving assistance, taking over low-level navigation to allow its user to travel efficiently and with greater ease. Our robotic wheelchair system, Wheelesley, consists of a standard powered wheelchair with an on-board computer, sensors and a graphical...